Conversation

@math625f (Contributor) commented Oct 24, 2025

What:

  • Bug Fix
  • New Feature

Description:

Currently the token usage is not included in the resulting streamed chunks from the completions API. This PR ensures that the usage is set if it is included in the response.
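With this change, reading the usage from a streamed completion looks roughly like the sketch below. It is based on the test scripts later in this thread, not on the PR diff itself; note that `stream_options.include_usage` must be set, or the API never emits a usage chunk at all.

```php
<?php
// Sketch, assuming the openai-php Laravel facade used elsewhere in this thread.
$stream = OpenAI::completions()->createStreamed([
    'model' => 'gpt-3.5-turbo-instruct',
    'prompt' => 'This is a test',
    // Without this, the API sends no usage chunk and usage stays null.
    'stream_options' => ['include_usage' => true],
]);

foreach ($stream as $response) {
    // With this PR, the final chunk (the one with empty choices)
    // carries a populated usage object instead of null.
    if ($response->usage !== null) {
        dump($response->toArray()['usage']);
    }
}
```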

@iBotPeaches (Collaborator) commented

Give Pint a run to format this so CI passes, and I believe it's good to go. I'll just double-check that ResponseUsage can handle all the missing/null values the legacy completions endpoint will provide here.

@iBotPeaches (Collaborator) commented

Is this not an OpenAI-side change? I updated my test script, ran this, and usage was null every single time. If it should work, could you point out the flaw in my sample so I can test this?

        $stream = OpenAI::completions()->createStreamed([
            'model' => 'gpt-3.5-turbo-instruct',
            'prompt' => 'This is a test',
        ]);

        foreach ($stream as $response) {
            dump(json_encode($response->toArray()));
        }
➜  openai-test git:(master) ✗ php artisan app:completion-streamed-test
"{"id":"cmpl-CUiU1TuLOiDBBaCd9SkzI89g0jrpw","object":"text_completion","created":1761436869,"model":"gpt-3.5-turbo-instruct:20230824-v2","choices":[{"text":" ","index":0,"logprobs":null,"finish_reason":null}],"usage":null}" // app/Console/Commands/CompletionStreamedTest.php:21
"{"id":"cmpl-CUiU1TuLOiDBBaCd9SkzI89g0jrpw","object":"text_completion","created":1761436869,"model":"gpt-3.5-turbo-instruct:20230824-v2","choices":[{"text":"3","index":0,"logprobs":null,"finish_reason":null}],"usage":null}" // app/Console/Commands/CompletionStreamedTest.php:21
"{"id":"cmpl-CUiU1TuLOiDBBaCd9SkzI89g0jrpw","object":"text_completion","created":1761436869,"model":"gpt-3.5-turbo-instruct:20230824-v2","choices":[{"text":"\");\n","index":0,"logprobs":null,"finish_reason":null}],"usage":null}" // app/Console/Commands/CompletionStreamedTest.php:21
"{"id":"cmpl-CUiU1TuLOiDBBaCd9SkzI89g0jrpw","object":"text_completion","created":1761436869,"model":"gpt-3.5-turbo-instruct:20230824-v2","choices":[{"text":"}","index":0,"logprobs":null,"finish_reason":null}],"usage":null}" // app/Console/Commands/CompletionStreamedTest.php:21
"{"id":"cmpl-CUiU1TuLOiDBBaCd9SkzI89g0jrpw","object":"text_completion","created":1761436869,"model":"gpt-3.5-turbo-instruct:20230824-v2","choices":[{"text":"","index":0,"logprobs":null,"finish_reason":"stop"}],"usage":null}" // app/Console/Commands/CompletionStreamedTest.php:21

@math625f (Contributor) commented Oct 27, 2025

Did you try with

...
'stream_options' => [
    'include_usage' => true
]
...

?

If that still doesn't include the usage, it could be a vLLM-specific thing; I'm testing against vLLM, not the OpenAI API.

@math625f (Contributor) commented

I just tested with OpenAI; these are the last few chunks when I set include_usage to true:

data: {"id":"cmpl-CVAu61NQidetdFlK6PYvY0U5Gib74","object":"text_completion","created":1761546118,"choices":[{"text":" you","index":0,"logprobs":null,"finish_reason":"length"}],"model":"gpt-3.5-turbo-instruct:20230824-v2","usage":null}

data: {"id":"cmpl-CVAu61NQidetdFlK6PYvY0U5Gib74","object":"text_completion","created":1761546118,"choices":[{"text":"","index":0,"logprobs":null,"finish_reason":"length"}],"model":"gpt-3.5-turbo-instruct:20230824-v2","usage":null}

data: {"id":"cmpl-CVAu61NQidetdFlK6PYvY0U5Gib74","object":"text_completion","created":1761546118,"model":"gpt-3.5-turbo-instruct:20230824-v2","usage":{"prompt_tokens":3,"completion_tokens":16,"total_tokens":19},"choices":[]}

data: [DONE]

My JSON body:

{
  "model": "gpt-3.5-turbo-instruct",
  "prompt": "Thisisatest",
  "stream": true,
  "stream_options": {
    "include_usage": true
  }
}
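As the raw chunks above show, only the final chunk (the one with empty choices) carries usage. A minimal, self-contained sketch in plain PHP (no client library) of picking the usage out of SSE lines shaped like those above; the data lines here are abbreviated copies of the thread's output:

```php
<?php
// Extract the usage object from raw SSE "data:" lines. Token counts are
// copied from the final chunk shown earlier in this thread.
$lines = [
    'data: {"id":"cmpl-x","object":"text_completion","choices":[{"text":" you"}],"usage":null}',
    'data: {"id":"cmpl-x","object":"text_completion","choices":[],"usage":{"prompt_tokens":3,"completion_tokens":16,"total_tokens":19}}',
    'data: [DONE]',
];

$usage = null;
foreach ($lines as $line) {
    $payload = substr($line, strlen('data: '));
    if ($payload === '[DONE]') {
        break; // end-of-stream sentinel, not JSON
    }
    $chunk = json_decode($payload, true);
    // isset() is false for null, so only the final chunk sets $usage.
    if (isset($chunk['usage'])) {
        $usage = $chunk['usage'];
    }
}

echo $usage['total_tokens'], PHP_EOL; // 19
```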

@iBotPeaches (Collaborator) commented

> Did you try with

Thanks - this was it.

@iBotPeaches iBotPeaches merged commit cb49b9f into openai-php:main Oct 27, 2025
12 checks passed
@iBotPeaches iBotPeaches added this to the v0.18.0 milestone Oct 27, 2025